Theory of Computation

Complete Engineering Notes, Practice Lab & Exam Preparation

Master computational theory with our comprehensive resource featuring complete lecture notes, interactive practice environment, 200+ solved questions, step-by-step explanations, visual automata simulator, and exam-focused content. Everything you need to excel in TOC - from basic concepts to advanced proofs and problem solving.

Complete Theory Notes

Comprehensive lecture notes with detailed explanations

Unit 1: Regular Languages and Finite Automata

1.1 Basic Definitions

Alphabet (Σ): A finite non-empty set of symbols or characters.

Example: Σ = {0, 1} (binary alphabet), Σ = {a, b, c, ..., z} (English alphabet)

String: A finite sequence of symbols from an alphabet.

  • Empty string (ε or λ): String with no symbols, length = 0
  • Length |w|: Number of symbols in string w
  • Concatenation: w₁w₂ means string w₁ followed by string w₂

Language (L): A set of strings over an alphabet Σ, i.e. L ⊆ Σ*

  • ∅ (empty language): Contains no strings
  • {ε}: Language containing only the empty string
  • Σ*: Set of all possible strings over Σ
  • Σ⁺: Set of all non-empty strings over Σ

1.2 Finite Automata

Deterministic Finite Automaton (DFA):

M = (Q, Σ, δ, q₀, F) where:

  • Q: Finite set of states
  • Σ: Input alphabet
  • δ: Q × Σ → Q (transition function)
  • q₀ ∈ Q: Initial/start state
  • F ⊆ Q: Set of final/accepting states

Extended transition function δ*: Q × Σ* → Q:
δ*(q, ε) = q
δ*(q, wa) = δ(δ*(q, w), a)

M accepts w iff δ*(q₀, w) ∈ F.
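
The DFA definition translates almost line for line into code. Below is a minimal simulation sketch in Python; the transition table is an assumed example (a DFA over {0,1} accepting strings that end in 01), and delta_star folds δ over the input exactly as the recursive definition above prescribes.

# Minimal DFA simulation sketch; the table is an illustrative example.
delta = {
    ('q0', '0'): 'q1', ('q0', '1'): 'q0',
    ('q1', '0'): 'q1', ('q1', '1'): 'q2',
    ('q2', '0'): 'q1', ('q2', '1'): 'q0',
}
start, finals = 'q0', {'q2'}

def delta_star(q, w):
    # δ*(q, ε) = q ;  δ*(q, wa) = δ(δ*(q, w), a)
    for a in w:
        q = delta[(q, a)]
    return q

def accepts(w):
    return delta_star(start, w) in finals

print(accepts("1101"))   # True: ends in 01
print(accepts("0110"))   # False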

1.3 Non-Deterministic Finite Automaton (NFA)

M = (Q, Σ, δ, q₀, F) where δ: Q × Σ_ε → P(Q)

  • Can have zero, one, or several transitions for the same state and input symbol
  • Can have ε-transitions (empty string transitions)
  • P(Q) is the power set of Q (set of all subsets of Q)

1.4 Regular Expressions

Algebraic notation to describe regular languages:

  • ∅: Empty language
  • ε: Language containing only the empty string
  • a: Language containing only the string "a"
  • R₁ ∪ R₂: Union (R₁ | R₂)
  • R₁ ∘ R₂: Concatenation (R₁R₂)
  • R*: Kleene star (zero or more repetitions)
  • R⁺: One or more repetitions (RR*)
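
These operators carry over directly to the pattern syntax of practical regex engines. The snippet below is an illustrative Python sketch (note that Python's re module also offers features that go beyond regular languages, such as backreferences):

import re

# The formal expression (0+1)*01, with + denoting union, becomes (0|1)*01.
pattern = re.compile(r"(0|1)*01")

for w in ["01", "1101", "10", ""]:
    print(repr(w), bool(pattern.fullmatch(w)))   # True, True, False, False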

1.5 Equivalence Theorem

Theorem: The following are equivalent for any language L:

  1. L is accepted by some DFA
  2. L is accepted by some NFA
  3. L is described by some regular expression

These languages are called Regular Languages.

1.6 Pumping Lemma for Regular Languages

Statement: If L is regular, then ∃p ≥ 1 such that ∀s ∈ L with |s| ≥ p, s can be written as s = xyz where:

  1. |y| > 0
  2. |xy| ≤ p
  3. ∀i ≥ 0, xy^i z ∈ L

Use: To prove a language is NOT regular (proof by contradiction)

1.7 Closure Properties

Regular languages are closed under:

  • Union: L₁ ∪ L₂
  • Intersection: L₁ ∩ L₂
  • Concatenation: L₁L₂
  • Kleene Star: L*
  • Complement: L̄ = Σ* - L
  • Difference: L₁ - L₂
  • Reversal: L^R

Complete Example: DFA Construction

Problem: Construct DFA for L = {w ∈ {0,1}* | w has even number of 0s and odd number of 1s}

State Representation: one state for each combination of (parity of 0s, parity of 1s) seen so far.

States: {q₀₀, q₀₁, q₁₀, q₁₁}

  • q₀₀: even 0s, even 1s (start state)
  • q₀₁: even 0s, odd 1s (final state)
  • q₁₀: odd 0s, even 1s
  • q₁₁: odd 0s, odd 1s

Transition Table:

State   Input 0   Input 1
q₀₀     q₁₀       q₀₁
q₀₁     q₁₁       q₀₀
q₁₀     q₀₀       q₁₁
q₁₁     q₀₁       q₁₀
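
The same table, written as a small Python sketch (state names encode the parities tracked so far), accepts exactly the strings with an even number of 0s and an odd number of 1s:

delta = {
    ('q00', '0'): 'q10', ('q00', '1'): 'q01',
    ('q01', '0'): 'q11', ('q01', '1'): 'q00',
    ('q10', '0'): 'q00', ('q10', '1'): 'q11',
    ('q11', '0'): 'q01', ('q11', '1'): 'q10',
}

def accepts(w):
    state = 'q00'                  # start: even 0s, even 1s
    for a in w:
        state = delta[(state, a)]
    return state == 'q01'          # accept: even 0s, odd 1s

print(accepts("00111"))   # True:  two 0s, three 1s
print(accepts("0101"))    # False: even 0s but also even 1s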

Unit 2: Context-Free Languages and Pushdown Automata

2.1 Context-Free Grammar (CFG)

G = (V, Σ, R, S) where:

  • V: Finite set of variables (non-terminals)
  • Σ: Finite set of terminals (V ∩ Σ = ∅)
  • R: Finite set of production rules
  • S ∈ V: Start variable

Production Rule: A → α where A ∈ V and α ∈ (V ∪ Σ)*

2.2 Derivations

  • Leftmost derivation: Always replace leftmost variable first
  • Rightmost derivation: Always replace rightmost variable first
  • Parse Tree: Tree representation of derivation

2.3 Ambiguity

A CFG is ambiguous if some string has multiple parse trees.

A language is inherently ambiguous if every CFG for it is ambiguous.

2.4 Chomsky Normal Form (CNF)

Every production is of the form:

  • A → BC (where A, B, C are variables)
  • A → a (where a is a terminal)
  • S → ε (only if ε ∈ L(G))

2.5 Pushdown Automaton (PDA)

M = (Q, Σ, Γ, δ, q₀, Z₀, F) where:

  • Q: Finite set of states
  • Σ: Input alphabet
  • Γ: Stack alphabet
  • δ: Q × Σ_ε × Γ_ε → P(Q × Γ_ε): transition function
  • q₀: Start state
  • Z₀: Initial stack symbol
  • F: Set of final states

2.6 Equivalence of CFG and PDA

Theorem: A language L is context-free iff it is accepted by some PDA.

2.7 Pumping Lemma for CFLs

If L is context-free, then ∃p ≥ 1 such that ∀s ∈ L with |s| ≥ p, s can be written as s = uvwxy where:

  1. |vwx| ≤ p
  2. |vx| > 0
  3. ∀i ≥ 0, uv^i wx^i y ∈ L

2.8 Closure Properties of CFLs

Closed under: Union, Concatenation, Kleene Star

NOT closed under: Intersection, Complement

Complete Example: CFG to PDA Conversion

Problem: Convert the CFG for {0ⁿ1ⁿ | n ≥ 0} to a PDA

CFG:

S → 0S1 | ε

PDA Construction:

  1. Push all 0s onto stack
  2. For each 1, pop a 0 from stack
  3. Accept if stack is empty at end

Transitions:

  • δ(q₀, 0, Z₀) = {(q₀, 0Z₀)}
  • δ(q₀, 0, 0) = {(q₀, 00)}
  • δ(q₀, 1, 0) = {(q₁, ε)}
  • δ(q₁, 1, 0) = {(q₁, ε)}
  • δ(q₁, ε, Z₀) = {(q₂, Z₀)}
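
Because this particular PDA is deterministic, its behaviour can be sketched with an explicit stack and a state variable. The Python sketch below mirrors the transitions listed above (names are illustrative):

def pda_accepts(w):
    stack, state = ['Z0'], 'q0'
    for a in w:
        if state == 'q0' and a == '0':
            stack.append('0')            # δ(q₀, 0, ·): push a 0
        elif a == '1' and stack[-1] == '0':
            stack.pop()                  # δ(·, 1, 0): pop a 0
            state = 'q1'
        else:
            return False                 # no applicable transition
    return stack == ['Z0']               # only Z₀ left: δ(q₁, ε, Z₀) accepts

print(pda_accepts("000111"))   # True
print(pda_accepts("0110"))     # False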

Unit 3: Turing Machines and Computability

3.1 Turing Machine Definition

M = (Q, Σ, Γ, δ, q₀, q_accept, q_reject) where:

  • Q: Finite set of states
  • Σ: Input alphabet (blank ∉ Σ)
  • Γ: Tape alphabet (Σ ⊂ Γ, blank ∈ Γ)
  • δ: Q × Γ → Q × Γ × {L, R}: transition function
  • q₀ ∈ Q: Start state
  • q_accept ∈ Q: Accept state
  • q_reject ∈ Q: Reject state (≠ q_accept)

3.2 Variants of Turing Machines

  • Multi-tape TM: Multiple tapes, same computational power
  • Non-deterministic TM: Multiple possible transitions
  • Enumerator: Prints strings of language
  • Two-way infinite tape: Tape extends infinitely in both directions

3.3 Church-Turing Thesis

Intuitive notion of "algorithm" = Turing Machine computation

Cannot be proven (it's a thesis), but universally accepted.

3.4 Decidable Languages

A language L is decidable if some TM decides it.

A TM decides L if it accepts all strings in L and rejects all strings not in L.

Examples of decidable languages:

  • A_DFA = {⟨B,w⟩ | B is a DFA that accepts w}
  • A_CFG = {⟨G,w⟩ | G is a CFG that generates w}
  • E_DFA = {⟨A⟩ | A is a DFA and L(A) = ∅}
  • EQ_DFA = {⟨A,B⟩ | A, B are DFAs and L(A) = L(B)}

3.5 Undecidable Languages

Acceptance problem (closely related to the Halting Problem): A_TM = {⟨M,w⟩ | M is a TM that accepts w}

Theorem: A_TM is undecidable.

Proof by diagonalization:

  1. Assume H decides A_TM
  2. Construct TM D using H
  3. D(⟨M⟩): if H accepts ⟨M,⟨M⟩⟩ then reject, else accept
  4. What does D do on input ⟨D⟩? Contradiction!
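
The argument can be phrased as a short program sketch. Assume, purely hypothetically, that a total function accepts(program, input) deciding A_TM existed; the function D below then behaves inconsistently on its own source, which is exactly the contradiction in step 4:

def accepts(program_source, input_string):
    # Hypothetical decider H for A_TM; assumed to exist, not implementable.
    raise NotImplementedError

def D(program_source):
    # Do the opposite of whatever H predicts about running the program on itself.
    if accepts(program_source, program_source):
        return False   # reject
    return True        # accept

# D applied to its own source would accept iff it does not accept:
# D(source_of_D) is True  iff  accepts(source_of_D, source_of_D) is False
#                         iff  D does not accept its own description.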

3.6 Reducibility

Language A is reducible to B (A ≤_m B) if there exists a computable function f such that:

w ∈ A ⟺ f(w) ∈ B

If A ≤_m B and B is decidable, then A is decidable.

If A ≤_m B and A is undecidable, then B is undecidable.

3.7 Rice's Theorem

Every non-trivial property of Turing-recognizable languages is undecidable.

A property P is trivial if it holds for no Turing-recognizable language (P = ∅) or for all of them.

Complete Example: Turing Machine for {0ⁿ1ⁿ2ⁿ | n ≥ 0}

Algorithm:

  1. Scan right, mark first 0 as X
  2. Continue right, mark first 1 as Y
  3. Continue right, mark first 2 as Z
  4. Return to start, repeat
  5. Accept if all symbols properly marked

Formal Description:

State q₁: Scan right for the first 0
   If 0 found: mark it X, go to q₂
   If Y or Z found: go to q₅ (check phase)
   If blank: accept

State q₂: Scan right for the first 1
   If 1 found: mark it Y, go to q₃
   If 2 or blank found: reject

State q₃: Scan right for the first 2
   If 2 found: mark it Z, go to q₄
   If blank: reject

State q₄: Return to the start
   Move left to the leftmost X, then go to q₁

State q₅: Final check
   If only X,Y,Z remain: accept
   Otherwise: reject
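
At a higher level, the marking procedure can be simulated on a Python list instead of a tape. This is only a sketch of the counting idea (the explicit order check at the start is something the real machine gets for free from its left-to-right scans):

def in_language(w):
    if list(w) != sorted(w):              # symbols must appear as 0s, then 1s, then 2s
        return False
    tape = list(w)
    while '0' in tape:
        tape[tape.index('0')] = 'X'       # q₁: mark the first remaining 0
        if '1' not in tape:
            return False                  # q₂ would reject
        tape[tape.index('1')] = 'Y'       # q₂: mark the first remaining 1
        if '2' not in tape:
            return False                  # q₃ would reject
        tape[tape.index('2')] = 'Z'       # q₃: mark the first remaining 2
    return all(c in 'XYZ' for c in tape)  # q₅: nothing unmarked may remain

print(in_language("001122"))   # True
print(in_language("001222"))   # False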
                            

Unit 4: Computational Complexity Theory

4.1 Time Complexity

Time complexity f(n) = maximum number of steps TM uses on input of length n.

TIME(f(n)): Collection of languages decidable by TM running in time O(f(n))

4.2 Complexity Classes

  • P: ⋃_{k} TIME(n^k) - polynomial time
  • EXPTIME: ⋃_{k} TIME(2^{n^k}) - exponential time
  • NP: Languages verifiable in polynomial time
  • NTIME(f(n)): Languages decidable by NTM in time O(f(n))

4.3 Verifiers

A verifier for a language A is an algorithm V such that:

A = {w | V accepts ⟨w,c⟩ for some certificate c}

Polynomial verifier: V runs in polynomial time in |w|

NP = {L | L has polynomial verifier}
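
A concrete verifier makes the definition tangible. The sketch below assumes the NP language SUBSET-SUM = {⟨S, t⟩ | some subset of the multiset S sums to t}; the certificate is the claimed subset, and checking it takes only linear time:

def verify_subset_sum(numbers, target, certificate):
    pool = list(numbers)                 # treat the input as a multiset
    for x in certificate:
        if x in pool:
            pool.remove(x)               # each certificate element must be available
        else:
            return False
    return sum(certificate) == target    # the chosen elements must hit the target

print(verify_subset_sum([3, 7, 1, 8], 11, [3, 8]))   # True: a valid certificate exists
print(verify_subset_sum([2, 4, 8], 5, [2, 4]))       # False: this certificate fails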

4.4 NP-Completeness

Language B is NP-complete if:

  1. B ∈ NP
  2. Every A ∈ NP is polynomial-time reducible to B

Cook-Levin Theorem: SAT is NP-complete

4.5 Common NP-Complete Problems

  • SAT: Boolean satisfiability
  • 3SAT: 3-CNF satisfiability
  • CLIQUE: k-clique problem
  • VERTEX-COVER: Vertex cover problem
  • HAMPATH: Hamiltonian path
  • TSP: Traveling salesman problem
  • SUBSET-SUM: Subset sum problem

4.6 Space Complexity

SPACE(f(n)): Languages decidable by TM using O(f(n)) space

  • PSPACE: ⋃_{k} SPACE(n^k)
  • NPSPACE: ⋃_{k} NSPACE(n^k)
  • L: SPACE(log n) - logarithmic space
  • NL: NSPACE(log n)

4.7 Hierarchy Theorems

Time Hierarchy: If f(n) log f(n) = o(g(n)), then TIME(f(n)) ⊊ TIME(g(n))

Space Hierarchy: If f(n) = o(g(n)), then SPACE(f(n)) ⊊ SPACE(g(n))

4.8 Relationships Between Classes

L ⊆ NL ⊆ P ⊆ NP ⊆ PSPACE = NPSPACE ⊆ EXPTIME

Savitch's Theorem: NSPACE(f(n)) ⊆ SPACE(f(n)²)

NP-Completeness Proof: 3SAT ≤_p CLIQUE

Problem: Prove CLIQUE is NP-complete

Step 1: Show CLIQUE ∈ NP

Given a graph G and integer k, the certificate is a set of k vertices. Verify in O(k²) time that all pairs are connected.

Step 2: Show 3SAT ≤_p CLIQUE

Given a 3SAT formula φ with m clauses:

  1. Create a vertex for each literal occurrence in each clause
  2. Connect two vertices if: (a) they come from different clauses, and (b) they are not complementary literals (x and x̄)
  3. φ is satisfiable ⟺ G has an m-clique

Example: φ = (x₁ ∨ x₂ ∨ x₃) ∧ (x̄₁ ∨ x̄₂ ∨ x₄)

This creates 6 vertices, with edges between non-complementary literals from different clauses.
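
The clause-to-triangle construction is easy to mechanize. The Python sketch below builds the graph for an arbitrary 3SAT instance (a literal is an assumed (variable, is_positive) pair; the representation is illustrative):

from itertools import combinations

def threesat_to_clique(clauses):
    # One vertex per literal occurrence, tagged with its clause index.
    vertices = [(i, lit) for i, clause in enumerate(clauses) for lit in clause]
    edges = set()
    for (i, (v1, s1)), (j, (v2, s2)) in combinations(vertices, 2):
        different_clause = i != j
        contradictory = (v1 == v2 and s1 != s2)
        if different_clause and not contradictory:
            edges.add(((i, (v1, s1)), (j, (v2, s2))))
    return vertices, edges, len(clauses)   # φ satisfiable iff an m-clique exists

# φ = (x1 ∨ x2 ∨ x3) ∧ (¬x1 ∨ ¬x2 ∨ x4): 6 vertices, target clique size 2.
phi = [[("x1", True), ("x2", True), ("x3", True)],
       [("x1", False), ("x2", False), ("x4", True)]]
vertices, edges, m = threesat_to_clique(phi)
print(len(vertices), m)   # 6 2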

Interactive Practice Lab

Test your understanding with interactive exercises

DFA String Acceptance Tester

Enter a string to test if it's accepted by the DFA for "strings ending with 01":


Regular Expression Matcher

Test if strings match common regular expressions:


Pumping Lemma Validator

Practice using pumping lemma to prove languages are not regular:


Question & Answer Bank

200+ Solved Questions with Detailed Explanations

Unit 1: Regular Languages

Q1: What is the difference between DFA and NFA? Explain with examples.


Answer:

Deterministic Finite Automaton (DFA):

  • For each state and input symbol, exactly one transition
  • No ε-transitions allowed
  • Easier to implement in hardware/software
  • δ: Q × Σ → Q

Non-Deterministic Finite Automaton (NFA):

  • For each state and input symbol, zero or more transitions
  • ε-transitions allowed
  • More intuitive for design
  • δ: Q × Σ_ε → P(Q)

Example: Language L = {strings over {0,1} containing "01"}

NFA: 3 states, built directly by "guessing" where the substring 01 starts. DFA: obtained from the NFA by subset construction.

Q2: Prove that the language L = {0ⁿ1ⁿ | n ≥ 0} is not regular using the pumping lemma.


Proof by Contradiction:

  1. Assume L is regular with pumping length p ≥ 1
  2. Choose string s = 0ᵖ1ᵖ ∈ L with |s| = 2p ≥ p
  3. By the pumping lemma: s = xyz where |xy| ≤ p, |y| > 0, and xy^i z ∈ L for all i ≥ 0
  4. Since |xy| ≤ p: both x and y consist only of 0s
  5. Let |y| = k > 0, so y = 0^k
  6. Pumping with i = 0: xz = 0^{p-k}1^p
  7. Since k > 0: p-k < p, so the string has fewer 0s than 1s
  8. Therefore xz ∉ L, contradicting the pumping lemma
  9. Conclusion: L is not regular ∎
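
Step 6 can be checked mechanically for any concrete p and k, which is a useful sanity check when writing such proofs. A small Python sketch (p and k are example values):

def in_L(w):                        # membership test for {0ⁿ1ⁿ | n ≥ 0}
    n = len(w) // 2
    return w == "0" * n + "1" * n

p, k = 7, 3                         # any p ≥ 1 and 0 < k ≤ p behave the same way
x, y, z = "0" * (p - k), "0" * k, "1" * p
print(in_L(x + y + z))              # True:  the original string 0ᵖ1ᵖ is in L
print(in_L(x + z))                  # False: pumping down (i = 0) leaves L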
Q3: Convert the regular expression (0+1)*01 to an equivalent NFA.


Step-by-step Construction:

  1. Break down: (0+1)*01 = (0+1)* · 0 · 1
  2. Construct NFA for (0+1)*:
    • State q₀ with self-loops on 0 and 1
    • q₀ is both start and final state
  3. Construct NFA for 01:
    • q₁ →0→ q₂ →1→ q₃
    • q₃ is the final state
  4. Combine using concatenation:
    • Add an ε-transition from the final state of the (0+1)* part (q₀) to the start of the 01 part (q₁)
    • Remove final status from the (0+1)* part

Final NFA: States {q₀, q₁, q₂, q₃} with q₃ final; it accepts exactly the strings ending in 01.

Unit 2: Context-Free Languages

Q4: Design a CFG for the language L = {aⁱbʲcᵏ | i = j or j = k; i, j, k ≥ 0}.


Analysis: L is the union of two languages: L₁ (i = j) and L₂ (j = k)

Grammar G:

S → S₁ | S₂

S₁ → X C         // For i = j
X → aXb | ε      // Equal a's and b's
C → cC | ε       // Any number of c's

S₂ → A Y         // For j = k
A → aA | ε       // Any number of a's
Y → bYc | ε      // Equal b's and c's

Example derivations:

  • aabbc (i = j = 2, k = 1): S → S₁ → XC → aXbC → aaXbbC → aabbC → aabbcC → aabbc
  • abbcc (i = 1, j = k = 2): S → S₂ → AY → aAY → aY → abYc → abbYcc → abbcc

Q5: Convert the CFG S → aSb | ab to Chomsky Normal Form.


Step 1: The original grammar generates {aⁿbⁿ | n ≥ 1}

Step 2: Replace terminals in mixed productions with new variables

S → ASB | AB
A → a
B → b

Step 3: Break down productions with more than 2 variables on the right-hand side

S → AC | AB    // where C → SB
A → a
B → b
C → SB

Final CNF:

S → AC | AB
A → a
B → b
C → SB
                        

Unit 3: Turing Machines

Q6: Design a Turing Machine that computes f(x) = 2x where x is in unary.


Input: 1ˣ (x ones)

Output: 1²ˣ (2x ones)

Algorithm:

  1. For each 1 in input, write two 1s in output area
  2. Mark processed 1s to avoid reprocessing
  3. Use tape efficiently with markers

Detailed Steps:

1. Mark first 1 as X, move right to find blank
2. Write 11 after blank, return to next unmarked 1
3. Repeat until all 1s processed
4. Clean up: replace X's with blanks, position result
5. Accept

Transitions:
δ(q₀, 1) = (q₁, X, R)      // Mark and move right
δ(q₁, 1) = (q₁, 1, R)      // Skip over 1s
δ(q₁, B) = (q₂, 1, R)      // Write first copy
δ(q₂, B) = (q₃, 1, L)      // Write second copy
... (return transitions)
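
As a sanity check of the idea (not of the formal transition table, whose return moves are omitted above), the doubling procedure can be simulated on a Python list, using a blank as separator between the marked input block and the output block:

def double_unary(x):
    tape = list("1" * x) + ["B"]     # input 1ˣ followed by a blank separator
    for i in range(x):               # for each input 1 ...
        tape[i] = "X"                # ... mark it as processed
        tape += ["1", "1"]           # ... and append two 1s to the output area
    return "".join(tape[x + 1:])     # the output block after the separator

print(double_unary(3))   # '111111': six 1s for input 111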
                        
Q7: Prove that the Halting Problem is undecidable using diagonalization.


Theorem: A_TM = {⟨M,w⟩ | M is a TM that accepts w} is undecidable.

Proof by Contradiction:

  1. Assume TM H decides A_TM
  2. H behavior:
    • H(⟨M,w⟩) = accept if M accepts w
    • H(⟨M,w⟩) = reject if M rejects w or loops
  3. Construct TM D:
    D on input ⟨M⟩:
    1. Run H on ⟨M,⟨M⟩⟩
    2. If H accepts: REJECT
    3. If H rejects: ACCEPT
                                    
  4. Consider D(⟨D⟩):
    • If D accepts ⟨D⟩ → H rejects ⟨D,⟨D⟩⟩ → D rejects ⟨D⟩ (contradiction)
    • If D rejects ⟨D⟩ → H accepts ⟨D,⟨D⟩⟩ → D accepts ⟨D⟩ (contradiction)
  5. Contradiction! Therefore H cannot exist.
  6. Conclusion: A_TM is undecidable ∎

Unit 4: Complexity Theory

Q8: Show that VERTEX-COVER is NP-complete by reducing 3SAT to it.


Step 1: Show VERTEX-COVER ∈ NP

  • Certificate: A set S of vertices
  • Verifier: Check |S| ≤ k and that every edge has an endpoint in S
  • Time: O(|E|) - polynomial

Step 2: Reduce 3SAT ≤_p VERTEX-COVER

Given: 3SAT formula φ with variables x₁,...,xₙ and clauses C₁,...,Cₘ

Construct graph G:

  1. Variable gadget: For each variable xᵢ, add an edge (xᵢ, x̄ᵢ)
  2. Clause gadget: For each clause Cⱼ = (ℓ₁ ∨ ℓ₂ ∨ ℓ₃), add a triangle with vertices labelled ℓ₁, ℓ₂, ℓ₃
  3. Connect: Add an edge from each clause-gadget vertex to the variable-gadget vertex carrying the same literal

Set k = n + 2m

Correctness:

  • φ satisfiable ⟹ G has a vertex cover of size k: From each variable edge choose the literal made true by the assignment; from each clause triangle choose the two vertices other than one satisfied literal
  • G has a vertex cover of size k ⟹ φ satisfiable: The cover must use exactly one endpoint per variable edge and exactly two vertices per triangle; setting the chosen variable endpoints to true satisfies every clause
Q9: Explain the relationship between P, NP, and PSPACE classes.


Class Definitions:

  • P: Languages decidable in polynomial time on deterministic TM
  • NP: Languages decidable in polynomial time on non-deterministic TM (or verifiable in polynomial time)
  • PSPACE: Languages decidable using polynomial space

Known Relationships:

P ⊆ NP ⊆ PSPACE

Justification:

  • P ⊆ NP: A deterministic TM is a special case of a non-deterministic TM
  • NP ⊆ PSPACE: An NTM can be simulated by exploring all computation paths one at a time, reusing polynomial space
  • PSPACE = NPSPACE: Savitch's theorem

Open Questions:

  • P ?= NP: The most famous open problem
  • NP ?= PSPACE: Also open
  • Known: P ≠ EXPTIME (by the time hierarchy theorem)

Practical Implications:

  • If P = NP: Much of cryptography breaks down and many optimization problems become easy
  • Most experts believe P ≠ NP

Q10: What is Cook-Levin theorem and why is it significant?


Cook-Levin Theorem (1971): SAT (Boolean Satisfiability) is NP-complete.

Significance:

  1. First NP-complete problem: Established the concept of NP-completeness
  2. Reduction template: Other problems proven NP-complete by reducing SAT to them
  3. Theoretical foundation: Launched complexity theory as major field

Proof Idea:

  1. SAT ∈ NP: Easy - guess assignment, verify in polynomial time
  2. Every L ∈ NP reduces to SAT:
    • For any L ∈ NP with verifier V
    • Given input x, construct a formula φ
    • φ is satisfiable ⟺ ∃ a certificate c such that V accepts ⟨x,c⟩
    • φ encodes the computation of V on ⟨x,c⟩

Construction Details:

  • Variables for each cell of V's configuration at each time step
  • Clauses ensure valid initial configuration
  • Clauses ensure valid transitions
  • Clauses ensure accepting final configuration

Impact: Led to discovery of thousands of NP-complete problems across all areas of computer science.

Finite Automata & Regular Languages

Foundation of computational models


Deterministic Finite Automata (DFA)

A finite state machine that accepts or rejects strings based on a sequence of characters, with exactly one transition per state-symbol pair.


Non-Deterministic Finite Automata (NFA)

Similar to DFA but can have multiple transitions for the same input symbol, including ε-transitions (empty string transitions).


Regular Expressions

A powerful notation for describing patterns in strings, equivalent in power to finite automata for recognizing regular languages.

Formal Definition of DFA

A DFA is a 5-tuple M = (Q, Σ, δ, q₀, F) where:

  • Q is a finite set of states
  • Σ is a finite alphabet
  • δ: Q × Σ → Q is the transition function
  • q₀ ∈ Q is the initial state
  • F ⊆ Q is the set of final states

Example: DFA for strings ending with '01'
q₀ ──0──→ q₁ ──1──→ q₂ (Final)

States: {q₀, q₁, q₂}   Alphabet: {0, 1}   Start state: q₀   Final state: q₂
Transitions: δ(q₀,0)=q₁, δ(q₀,1)=q₀, δ(q₁,0)=q₁, δ(q₁,1)=q₂, δ(q₂,0)=q₁, δ(q₂,1)=q₀

This DFA accepts strings that end with the pattern '01'. State qβ‚€ is the initial state, q₁ represents having seen a '0', and qβ‚‚ (final) represents having seen '01'.

Algorithm: NFA to DFA Conversion (Subset Construction), sketched in code below
  1. Create the initial DFA state: the ε-closure of the NFA start state
  2. For each new DFA state and each input symbol:
  3.   Find all NFA states reachable on that symbol
  4.   Compute the ε-closure of the result
  5.   Create a new DFA state if it does not already exist
  6. Repeat until no new states are created
  7. Mark a DFA state as final if it contains any NFA final state
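
A compact subset-construction sketch in Python, assuming an NFA encoded as a dict mapping (state, symbol) to a set of states, with the empty string '' used for ε-moves (the encoding and helper names are illustrative):

from itertools import chain

def eps_closure(nfa, states):
    closure, stack = set(states), list(states)
    while stack:
        q = stack.pop()
        for r in nfa.get((q, ''), set()) - closure:
            closure.add(r)
            stack.append(r)
    return frozenset(closure)

def subset_construction(nfa, start, finals, alphabet):
    start_state = eps_closure(nfa, {start})
    dfa, todo, seen = {}, [start_state], {start_state}
    while todo:
        S = todo.pop()
        for a in alphabet:
            moved = set(chain.from_iterable(nfa.get((q, a), set()) for q in S))
            T = eps_closure(nfa, moved)
            dfa[(S, a)] = T
            if T not in seen:
                seen.add(T)
                todo.append(T)
    dfa_finals = {S for S in seen if S & finals}
    return dfa, start_state, dfa_finals

# NFA for strings containing "01": q0 loops on 0,1 and guesses the 0; q2 loops.
nfa = {('q0', '0'): {'q0', 'q1'}, ('q0', '1'): {'q0'},
       ('q1', '1'): {'q2'},
       ('q2', '0'): {'q2'}, ('q2', '1'): {'q2'}}
dfa, s0, fin = subset_construction(nfa, 'q0', {'q2'}, '01')
print(len({S for (S, _) in dfa}))   # 4 reachable subset states (the minimal DFA has 3)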
Pumping Lemma for Regular Languages

If L is a regular language, then there exists a positive integer p such that any string s ∈ L with |s| ≥ p can be divided into three parts s = xyz such that:

  • |y| > 0 (y is non-empty)
  • |xy| ≤ p
  • For all i ≥ 0, xy^i z ∈ L

Context-Free Grammars & Pushdown Automata

Handling nested structures and programming languages


Context-Free Grammars

Formal grammars that can generate context-free languages, essential for parsing programming languages and handling nested structures.


Pushdown Automata

Finite automata enhanced with a stack memory, providing the computational power needed to recognize context-free languages.


Normal Forms

Standardized forms of CFGs like Chomsky Normal Form (CNF) and Greibach Normal Form (GNF) for easier analysis and parsing.

Formal Definition of Context-Free Grammar

A CFG is a 4-tuple G = (V, Σ, R, S) where:

  • V is a finite set of variables (non-terminals)
  • Σ is a finite set of terminals
  • R is a finite set of rules (productions)
  • S ∈ V is the start variable

Example: CFG for Balanced Parentheses
S → ε | SS | (S)

This grammar generates all strings of balanced parentheses. The rules allow for the empty string (ε), concatenation of balanced strings (SS), and nesting within parentheses ((S)).
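
Membership in this language also has a simple counter characterisation, which the Python sketch below uses (a standard equivalence with the grammar S → ε | SS | (S), stated here as an assumption):

def balanced(w):
    depth = 0
    for ch in w:
        if ch == '(':
            depth += 1
        elif ch == ')':
            depth -= 1
            if depth < 0:            # a ')' with no matching '(' so far
                return False
        else:
            return False             # only parentheses are in the alphabet
    return depth == 0                # every '(' must eventually be closed

print(balanced("(()())"))   # True
print(balanced("(()"))      # False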

Converting CFG to Chomsky Normal Form
  1. Eliminate ε-productions (except from the start variable if needed)
  2. Eliminate unit productions (A → B)
  3. Replace terminals in mixed productions with new variables
  4. Break down productions with more than 2 variables on the RHS
  5. Result: All productions have the form A → BC or A → a

Pushdown Automaton (PDA)

A PDA is a 6-tuple M = (Q, Σ, Γ, δ, q₀, F) where:

  • Q is a finite set of states
  • Σ is the input alphabet
  • Γ is the stack alphabet
  • δ: Q × Σ_ε × Γ_ε → P(Q × Γ_ε) is the transition function
  • q₀ ∈ Q is the start state
  • F ⊆ Q is the set of accept states

Pumping Lemma for Context-Free Languages

If L is a context-free language, then there exists p > 0 such that any string s ∈ L with |s| ≥ p can be written as s = uvwxy where:

  • |vwx| ≤ p
  • |vx| > 0 (at least one of v, x is non-empty)
  • For all i ≥ 0, uv^i wx^i y ∈ L

Turing Machines & Computability

The ultimate model of computation


Turing Machine Model

The most powerful model of computation, equivalent to any algorithm that can be computed. Foundation for understanding computability limits.


Halting Problem

The famous undecidable problem: determining whether a given program will halt on a specific input. Proof of computational limits.


Decidable vs Undecidable

Classification of problems into those that can be solved algorithmically (decidable) and those that cannot (undecidable).

Turing Machine Definition

A Turing Machine is a 7-tuple M = (Q, Σ, Γ, δ, q₀, q_accept, q_reject) where:

  • Q is a finite set of states
  • Σ is the input alphabet (not containing the blank symbol)
  • Γ is the tape alphabet (Σ ⊂ Γ, blank ∈ Γ)
  • δ: Q × Γ → Q × Γ × {L, R} is the transition function
  • q₀ ∈ Q is the start state
  • q_accept ∈ Q is the accept state
  • q_reject ∈ Q is the reject state (≠ q_accept)

Example: Turing Machine for the Language {0^n 1^n | n ≥ 0}
  1. Scan right, mark first 0 as X, continue to first 1
  2. Mark first 1 as Y, return to leftmost X
  3. Repeat until no more 0s or 1s to mark
  4. Scan to check all symbols are marked
  5. Accept if string has equal 0s and 1s in correct order
Church-Turing Thesis

The informal statement that the class of functions computable by a Turing Machine corresponds exactly to the class of functions that we would intuitively consider to be "computable" by an algorithm.

Rice's Theorem

For any non-trivial property P of the language recognized by Turing machines, the problem of determining whether a given Turing machine recognizes a language with property P is undecidable.

Computational Complexity Theory

Analyzing the resources required for computation


P vs NP Problem

The most famous open problem in computer science: whether every problem whose solution can be verified in polynomial time can also be solved in polynomial time.


NP-Complete Problems

The hardest problems in NP class. If any NP-complete problem can be solved in polynomial time, then P = NP.


Complexity Classes

Hierarchy of complexity classes: P, NP, PSPACE, EXPTIME, and their relationships and separations.

Complexity Class Definitions

  • P: Problems solvable in polynomial time by deterministic TM
  • NP: Problems verifiable in polynomial time (or solvable in polynomial time by non-deterministic TM)
  • NP-Complete: Problems in NP that are as hard as any problem in NP
  • NP-Hard: Problems at least as hard as NP-complete problems (may not be in NP)
  • PSPACE: Problems solvable using polynomial space
Classic NP-Complete Problems
  • SAT (Satisfiability): Can a Boolean formula be satisfied?
  • 3-SAT: SAT restricted to clauses with exactly 3 literals
  • Clique: Does a graph contain a clique of size k?
  • Vertex Cover: Is there a vertex cover of size k?
  • Hamiltonian Path: Does a graph have a Hamiltonian path?
  • TSP (Traveling Salesman): Shortest tour visiting all cities
Cook-Levin Theorem

The Boolean satisfiability problem (SAT) is NP-complete. This was the first problem proven to be NP-complete and forms the basis for proving other problems NP-complete through polynomial-time reductions.

P ⊆ NP ⊆ PSPACE ⊆ EXPTIME

Practice Problems & Solutions

Test your understanding with these problems

Regular Languages Problems

Problem 1:

Construct a DFA that accepts strings over {0,1} where the number of 1s is odd and the number of 0s is even.

Solution:

We need 4 states to track parity of both 0s and 1s:

  • q₀₀: even 0s, even 1s (start state)
  • q₀₁: even 0s, odd 1s (final state)
  • q₁₀: odd 0s, even 1s
  • q₁₁: odd 0s, odd 1s
Problem 2:

Prove that the language L = {0ⁿ1ⁿ | n ≥ 0} is not regular using the pumping lemma.

Solution:

Proof by contradiction:

  1. Assume L is regular with pumping length p
  2. Consider the string s = 0ᵖ1ᵖ ∈ L, |s| = 2p ≥ p
  3. By the pumping lemma, s = xyz where |xy| ≤ p, |y| > 0
  4. Since |xy| ≤ p, y consists only of 0s
  5. Pumping: xy²z has more 0s than 1s, so xy²z ∉ L
  6. Contradiction! Therefore L is not regular.

Context-Free Languages Problems

Problem 3:

Write a CFG for the language L = {aⁱbʲcᵏ | i = j or j = k; i, j, k ≥ 0}.

Solution:
S → S₁ | S₂
S₁ → X C          // i = j: equal a's and b's, followed by any number of c's
X → aXb | ε
C → cC | ε
S₂ → A Y          // j = k: any number of a's, followed by equal b's and c's
A → aA | ε
Y → bYc | ε

Turing Machines Problems

Problem 4:

Design a Turing machine that computes the function f(n) = n + 1 where n is represented in unary (as 1ⁿ).

Solution:

Algorithm:

  1. Start at leftmost 1
  2. Move right to find the blank after last 1
  3. Write 1 on the blank
  4. Move left to beginning and accept

Complexity Theory Problems

Problem 5:

Prove that CLIQUE is NP-complete by showing: (1) CLIQUE ∈ NP, (2) 3-SAT ≤ₚ CLIQUE.

Solution:

(1) CLIQUE ∈ NP: Given graph G and integer k, we can verify a clique of size k in polynomial time by checking all edges between claimed clique vertices.

(2) 3-SAT ≤ₚ CLIQUE: For each clause of the 3-SAT formula, create a vertex for each of its literals. Connect two vertices if they come from different clauses and are not complementary literals. A formula with k clauses is satisfiable iff the constructed graph has a k-clique.

Complete Exam Preparation

Everything you need to excel in TOC exams


Formula Sheet

Essential formulas, definitions, and theorems for quick reference during exams and problem solving.

πŸ†

Previous Year Papers

Solved previous exam questions with detailed explanations and marking schemes.


Mock Tests

Timed practice tests with instant feedback and performance analysis.


Problem-Solving Tricks

Expert tips and shortcuts for common problem patterns and construction techniques.

Essential Formula Sheet

Regular Languages

  • DFA: M = (Q, Σ, δ, q₀, F), δ: Q × Σ → Q
  • NFA: M = (Q, Σ, δ, q₀, F), δ: Q × Σ_ε → P(Q)
  • Pumping Lemma: |y| > 0, |xy| ≤ p, xy^i z ∈ L ∀i ≥ 0
  • Closure: ∪, ∩, -, *, concatenation, reversal

Context-Free Languages

  • CFG: G = (V, Σ, R, S)
  • PDA: M = (Q, Σ, Γ, δ, q₀, Z₀, F), δ: Q × Σ_ε × Γ_ε → P(Q × Γ_ε)
  • Pumping Lemma: |vwx| ≤ p, |vx| > 0, uv^i wx^i y ∈ L ∀i ≥ 0
  • CNF: A → BC or A → a

Turing Machines

  • TM: M = (Q, Σ, Γ, δ, q₀, q_accept, q_reject)
  • δ: Q × Γ → Q × Γ × {L, R}
  • Church-Turing Thesis: TM ≡ Algorithm

Complexity Classes

  • P: ⋃_k TIME(n^k)
  • NP: Polynomial-time verifiable
  • PSPACE: ⋃_k SPACE(n^k)
  • Hierarchy: P ⊆ NP ⊆ PSPACE ⊆ EXPTIME

Common Exam Question Types

Type 1: Automata Construction (20-30 marks)

  • Construct DFA/NFA for given language description
  • Convert NFA to DFA using subset construction
  • Design PDA for context-free languages
  • Write Turing machine for computations

Type 2: Formal Proofs (15-25 marks)

  • Use pumping lemma to prove non-regularity
  • Prove undecidability using reductions
  • Show NP-completeness of problems
  • Demonstrate closure properties

Type 3: Grammar and Language Theory (15-20 marks)

  • Design CFG for given languages
  • Convert CFG to normal forms
  • Determine language classes (regular, CF, etc.)
  • Parse strings using given grammars

Type 4: Complexity Analysis (10-15 marks)

  • Classify problems into complexity classes
  • Perform polynomial-time reductions
  • Analyze time and space complexity
  • Understand P vs NP implications

Expert Problem-Solving Strategies

DFA Construction Strategy

  1. Identify key patterns: What properties must be tracked?
  2. Determine states: Each state represents what you've "remembered"
  3. Define transitions: How does memory change with each input?
  4. Mark final states: Which memories indicate acceptance?
  5. Test with examples: Trace through accept/reject cases

Pumping Lemma Strategy

  1. Assume regularity: Let p be the pumping length
  2. Choose a clever string: Pick s ∈ L with |s| ≥ p that will cause problems when pumped
  3. Consider all decompositions: Use the constraints |xy| ≤ p, |y| > 0
  4. Find a contradiction: Show xy^i z ∉ L for some i
  5. Common choices: 0^p1^p, a^(p!+p), 0^q where q ≥ p is prime

NP-Completeness Proof Strategy

  1. Show L ∈ NP: Describe polynomial-time verifier
  2. Choose known NP-complete problem: Usually 3SAT, CLIQUE, or VERTEX-COVER
  3. Design reduction: Transform instances efficiently
  4. Prove correctness: Show equivalence both directions
  5. Analyze complexity: Verify reduction runs in polynomial time

Comprehensive Study Checklist

Unit 1: Regular Languages

Master DFA construction for string patterns (prefixes, suffixes, substrings)
Convert NFA to DFA using subset construction algorithm
Design regular expressions and convert to/from automata
Apply pumping lemma to prove non-regularity
Use closure properties for language constructions

Unit 2: Context-Free Languages

Design CFG for nested structures and arithmetic expressions
Convert CFG to Chomsky Normal Form systematically
Construct PDA for context-free languages
Apply CFL pumping lemma to prove non-context-freeness
Understand ambiguity and parse tree construction

Unit 3: Turing Machines & Computability

Design TM for arithmetic operations and string processing
Understand Church-Turing thesis and its implications
Prove decidability of specific language problems
Master diagonalization proof technique
Use reductions to show undecidability
Apply Rice's theorem to language properties

Unit 4: Complexity Theory

Understand relationships between P, NP, PSPACE
Design polynomial-time verifiers for NP problems
Perform polynomial-time reductions correctly
Prove NP-completeness for new problems
Understand Cook-Levin theorem and its significance
Analyze space complexity and hierarchy theorems
Final Exam Tips & Time Management

Before the Exam:

  • Practice constructions by hand - Don't rely only on understanding
  • Memorize key theorem statements - Pumping lemmas, Rice's theorem, etc.
  • Review common reduction patterns - 3SAT reductions, decidability reductions
  • Solve previous years completely - Time yourself and check solutions

During the Exam:

  • Read all questions first - Plan time allocation (40% construction, 35% proofs, 25% theory)
  • Start with easiest constructions - Build confidence and save hard proofs for later
  • Show all steps clearly - Partial credit is significant in TOC
  • Verify constructions with examples - Test your DFA/TM with sample strings
  • For proofs: state assumptions clearly - Follow standard proof formats

Common Mistakes to Avoid:

  • ❌ Confusing DFA and NFA transition functions
  • ❌ Incorrect pumping lemma string choices
  • ❌ Missing ε-transitions in NFA/PDA
  • ❌ Incomplete reduction proofs (missing both directions)
  • ❌ Mixing up complexity classes (P vs NP)
Success Strategy

Practice consistently rather than cramming. Work through problems systematically, understand the underlying concepts, and build your intuition through multiple examples. Focus on problem-solving techniques rather than just memorizing solutions. Remember: TOC is about logical reasoning and formal methods - these skills improve with practice!